329 research outputs found

    The Apology

    Boundary scan system design

    Given the strong competition in digital design at the national and international levels, boundary scan devices are rapidly becoming a necessary rather than merely a convenient feature on integrated circuits. This thesis serves a dual purpose. First, it demonstrates how boundary scan devices can be used to increase the testability of a circuit, and it presents several factors used to quantify the cost of adding boundary scan compatibility to digital designs. Cost tradeoffs are often the most intimidating hurdle for engineers to cross when deciding whether boundary scan compatibility is worth the effort. Second, it demonstrates the use of the Tektronix LV500 logic verifier as a general testing tool, using boundary scan designs as examples. These examples provide an understanding of the function of boundary scan cells and the JTAG/IEEE 1149.1 standard. The LV500, which is used by students in the Department of Computer Engineering and Microelectronic Engineering at RIT, is an indispensable tool for making critical timing measurements. It also allows a user to evaluate and step through simple as well as more complicated designs. It is my hope that this thesis and the tutorial provided will facilitate the use of the LV500 in future testing work performed in RIT's Center for Microelectronic and Computer Engineering clean room facilities. By following the example circuits described, one should become familiar with boundary scan terminology as well as the methodology used in designing such a system.


    The Radios of September

    During the Fall

    Using Numerical Dynamic Programming to Compare Passive and Active Learning in the Adaptive Management of Nutrients in Shallow Lakes

    This paper illustrates the use of dual/adaptive control methods to compare passive and active adaptive management decisions in the context of an ecosystem with a threshold effect. Using discrete-time dynamic programming techniques, we model optimal phosphorus loadings under both uncertainty about natural loadings and uncertainty regarding the critical level of phosphorus concentrations beyond which nutrient recycling begins. Active management is modeled by including the anticipated value of information (or learning) in the structure of the problem, so that the agent can perturb the system (experiment), update beliefs, and learn about the uncertain parameter. Using this formulation, we define and value optimal experimentation both ex ante and ex post. Our simulation results show that experimentation is optimal over a large range of the phosphorus concentration and belief space, though the ex ante benefits are small. Furthermore, realized benefits may depend critically on the true underlying parameters of the problem.
    Keywords: adaptive control, adaptive management, dynamic programming, value of experimentation, value of information, nonpoint source pollution, learning, decisions under uncertainty, Resource/Energy Economics and Policy
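The discrete-time dynamic programming approach in the abstract above can be sketched with a toy value-iteration model. All dynamics and parameters below (the decay rate, threshold, recycling term, and quadratic damage function) are illustrative assumptions, not the paper's calibrated model; the sketch only shows the Bellman-update structure for a loading decision facing a threshold recycling effect.

```python
import numpy as np

# Toy shallow-lake model: state is phosphorus concentration, action is the
# loading decision. Parameters are illustrative assumptions only.
P_GRID = np.linspace(0.0, 2.0, 101)   # phosphorus concentration states
LOADS = np.linspace(0.0, 0.5, 21)     # candidate loading decisions
BETA = 0.95                           # discount factor
DECAY = 0.6                           # fraction of P retained each period
THRESHOLD = 1.0                       # assumed critical recycling level
RECYCLE = 0.3                         # internal loading once past threshold

def step(p, load):
    """Next-period concentration under the assumed threshold dynamics."""
    internal = RECYCLE if p > THRESHOLD else 0.0
    return np.clip(DECAY * p + load + internal, P_GRID[0], P_GRID[-1])

def reward(p, load):
    """Benefit of loading minus quadratic damage from concentration."""
    return load - 2.0 * p**2

def value_iteration(tol=1e-6, max_iter=2000):
    v = np.zeros_like(P_GRID)
    for _ in range(max_iter):
        v_new = np.empty_like(v)
        for i, p in enumerate(P_GRID):
            # Bellman update: choose the loading that maximizes current
            # reward plus the discounted interpolated continuation value.
            v_new[i] = max(
                reward(p, a) + BETA * np.interp(step(p, a), P_GRID, v)
                for a in LOADS
            )
        if np.max(np.abs(v_new - v)) < tol:
            return v_new
        v = v_new
    return v

values = value_iteration()
```

Active adaptive management, as the abstract describes, would extend this state with a belief distribution over the uncertain threshold and include the value of learning in the Bellman update; the sketch above covers only the certainty-equivalent core.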

    Visible camera cryostat design and performance for the SuMIRe Prime Focus Spectrograph (PFS)

    We describe the design and performance of the SuMIRe Prime Focus Spectrograph (PFS) visible camera cryostats. SuMIRe PFS is a massively multiplexed ground-based spectrograph consisting of four identical spectrograph modules, each receiving roughly 600 fibers from a 2394-fiber robotic positioner at the prime focus. Each spectrograph module has three channels covering the wavelength ranges 380 nm – 640 nm, 640 nm – 955 nm, and 955 nm – 1.26 μm, with the dispersed light being imaged in each channel by an f/1.07 vacuum Schmidt camera. The cameras are very large, having a clear aperture of 300 mm at the entrance window and a mass of ∼280 kg. In this paper we describe the design of the visible camera cryostats and discuss various aspects of cryostat performance.

    Data Reduction Pipeline for the CHARIS Integral-Field Spectrograph I: Detector Readout Calibration and Data Cube Extraction

    We present the data reduction pipeline for CHARIS, a high-contrast integral-field spectrograph for the Subaru Telescope. The pipeline constructs a ramp from the raw reads using the measured nonlinear pixel response, and reconstructs the data cube using one of three extraction algorithms: aperture photometry, optimal extraction, or χ² fitting. We measure and apply both a detector flatfield and a lenslet flatfield, and we reconstruct the wavelength- and position-dependent lenslet point-spread function (PSF) from images taken with a tunable laser. We use these measured PSFs to implement a χ²-based extraction of the data cube, with typical residuals of ~5% due to imperfect models of the undersampled lenslet PSFs. The full two-dimensional residual of the χ² extraction allows us to model and remove correlated read noise, dramatically improving CHARIS' performance. The χ² extraction produces a data cube that has been deconvolved with the line-spread function, and it never performs any interpolation of either the data or the individual lenslet spectra. The extracted data cube also includes uncertainties for each spatial and spectral measurement. CHARIS' software is parallelized, written in Python and Cython, and freely available on GitHub with a separate documentation page. Astrometric and spectrophotometric calibrations of the data cubes and PSF subtraction will be treated in a forthcoming paper.
    Comment: 18 pages, 15 figures, 3 tables, replaced with JATIS accepted version (emulateapj formatted here). Software at https://github.com/PrincetonUniversity/charis-dep and documentation at http://princetonuniversity.github.io/charis-de
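The χ²-based extraction idea in the abstract above — modeling a lenslet's microspectrum as a linear combination of per-wavelength PSF templates and solving weighted least squares for the spectral amplitudes — can be sketched as follows. The Gaussian templates, postage-stamp geometry, and uniform inverse-variance weights are illustrative assumptions, not CHARIS's actual calibration products or pipeline code.

```python
import numpy as np

def gaussian_psf(ny, nx, y0, x0, sigma=1.2):
    """Unit-normalized Gaussian stand-in for a measured lenslet PSF."""
    y, x = np.mgrid[0:ny, 0:nx]
    psf = np.exp(-((y - y0) ** 2 + (x - x0) ** 2) / (2 * sigma**2))
    return psf / psf.sum()

def extract_spectrum(image, templates, ivar):
    """Minimize sum(ivar * (image - sum_k s_k * template_k)^2) over s."""
    a = templates.reshape(len(templates), -1).T   # design matrix (npix, nlam)
    w = np.sqrt(ivar.ravel())                     # per-pixel weights
    coeffs, *_ = np.linalg.lstsq(a * w[:, None], image.ravel() * w, rcond=None)
    return coeffs

# Simulate one lenslet microspectrum: 5 wavelengths along a short trace,
# with overlapping (undersampled-like) PSFs on a 12x12 postage stamp.
ny, nx, nlam = 12, 12, 5
templates = np.array([gaussian_psf(ny, nx, 3 + 1.5 * k, 6) for k in range(nlam)])
true_spec = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
image = np.tensordot(true_spec, templates, axes=1)  # noiseless model image
ivar = np.ones((ny, nx))                            # uniform inverse variance

spectrum = extract_spectrum(image, templates, ivar)
```

In the noiseless case the weighted least-squares solve recovers the input amplitudes exactly, even though adjacent templates overlap; the pipeline's real gain, per the abstract, comes from fitting the full 2D residual and removing correlated read noise.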